On the Convergence Properties of the Hopfield Model
Author
Abstract
The main contribution is showing that the known convergence properties of the Hopfield model can be reduced to a very simple case, for which we have an elementary proof. The convergence properties of the Hopfield model depend on the structure of the interconnection matrix W and the method by which the nodes are updated. Three cases are known: (1) convergence to a stable state when operating in a serial mode with symmetric W, (2) convergence to a cycle of length at most 2 when operating in a fully parallel mode with symmetric W, and (3) convergence to a cycle of length 4 when operating in a fully parallel mode with antisymmetric W. We review the three known results and prove that the fully parallel mode of operation is a special case of the serial mode of operation, for which we present an elementary proof. The elementary proof (one which does not involve the concept of an energy function) follows from the relations between the model and cuts in the graph. We also prove that the three known cases are the only interesting ones by exhibiting exponential lower bounds on the length of the cycles in the other cases.
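The two symmetric-W cases in the abstract can be observed directly by simulating the model. The sketch below (an illustration, not the paper's construction; the network size, random weights, and tie-breaking convention sgn(0) = +1 are assumptions) runs a Hopfield network with ±1 states under both update modes: the serial mode reaches a stable state, and the fully parallel mode falls into a cycle of length at most 2.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgn(v):
    # Sign with the convention sgn(0) = +1, so states stay in {-1, +1}.
    return np.where(v >= 0, 1, -1)

def serial_run(W, x, max_sweeps=100):
    """Serial mode: update one node at a time until no node changes.

    With symmetric W (and non-negative diagonal) this converges to a
    stable state -- case (1) in the abstract.
    """
    x = x.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(x)):
            new = 1 if W[i] @ x >= 0 else -1
            if new != x[i]:
                x[i] = new
                changed = True
        if not changed:
            return x  # stable state reached
    return None

def parallel_cycle_length(W, x, max_steps=200):
    """Fully parallel mode: update all nodes simultaneously.

    Returns the length of the cycle the trajectory enters; with
    symmetric W it is at most 2 -- case (2) in the abstract.
    """
    seen = {}
    x = x.copy()
    for t in range(max_steps):
        key = tuple(x)
        if key in seen:
            return t - seen[key]
        seen[key] = t
        x = sgn(W @ x)
    return None

n = 8
A = rng.standard_normal((n, n))
W = A + A.T                 # symmetric interconnection matrix
np.fill_diagonal(W, 0.0)    # zero (hence non-negative) diagonal
x0 = sgn(rng.standard_normal(n))

stable = serial_run(W, x0)
cycle_len = parallel_cycle_length(W, x0)
```

Running this with other seeds gives the same qualitative behavior: the serial trajectory always halts at a fixed point, while the parallel trajectory ends in a fixed point or a 2-cycle, in line with the two known symmetric-W results.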
Related Articles
Convergence Analysis of Hopfield Neural Networks
Abstract—In this paper, we analyze the convergence and stability properties of Hopfield Neural Networks (HNN). The global convergence and asymptotic stability of HNN have found various successful applications in computing and optimization problems. After determining the mathematical model of the network, we do some analysis on the model. This analysis is based on the Lyapunov Stability Theorem. Firstly, w...
A new model of (I+S)-type preconditioner for system of linear equations
In this paper, we design a new model of preconditioner for systems of linear equations. The convergence properties of the proposed methods have been analyzed and compared with the classical methods. Numerical experiments on convection-diffusion equations show a good improvement in convergence, and show that the convergence rates of the proposed methods are superior to the other modified itera...
Computing the capacity of the Hopfield neural network and presenting a practical method for increasing memory capacity
The capacity of the Hopfield model has been considered an important parameter in using this model. In this paper, the Hopfield neural network is modeled as a Shannon channel and an upper bound on its capacity is found. For achieving maximum memory, we focus on the training algorithm of the network, and prove that the capacity of the network is bounded by the maximum number of the ortho...
An Analytical Model for Predicting the Convergence Behavior of the Least Mean Mixed-Norm (LMMN) Algorithm
The Least Mean Mixed-Norm (LMMN) algorithm is a stochastic gradient-based algorithm whose objective is to minimize a combination of the cost functions of the Least Mean Square (LMS) and Least Mean Fourth (LMF) algorithms. This algorithm has inherited many properties and advantages of the LMS and LMF algorithms and mitigated their weaknesses in some ways. The main issue of the LMMN algorithm is t...
On global stability of Hopfield neural networks with discontinuous neuron activations
The paper introduces a general class of neural networks where the neuron activations are modeled by discontinuous functions. The neural networks have an additive interconnecting structure and include as particular cases the Hopfield neural networks (HNNs) and the standard Cellular Neural Networks (CNNs), in the limiting situation where the HNNs and CNNs possess neurons with infinite gain...
Publication date: 2004